Neural data-to-text generation with dynamic content planning
Authors
Abstract
Neural data-to-text generation models have achieved significant advancement in recent years. However, these models have two shortcomings: the generated texts tend to miss some vital information, and they often generate descriptions that are not consistent with the structured input data. To alleviate these problems, we propose a Neural data-to-text generation model with Dynamic content Planning, named NDP for abbreviation. The NDP can utilize the previously generated text to dynamically select the appropriate entry from the given structured data. We further design a reconstruction mechanism with a novel objective function that can reconstruct the whole entry of the used data sequentially from the hidden states of the decoder, which aids the accuracy of the generated text. Empirical results show that the NDP achieves superior performance over the state-of-the-art on the ROTOWIRE and NBAZHN datasets, in terms of relation generation (RG), content selection (CS), content ordering (CO) and BLEU metrics. The human evaluation result shows that the texts generated by the proposed NDP are better than the corresponding ones generated by NCP in most cases. And using the proposed reconstruction mechanism, the fidelityty of the generated text can be further improved significantly.
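The core idea of dynamic content planning can be illustrated with a minimal sketch (this is not the authors' implementation; the function name, dot-product scoring, and dimensions are illustrative assumptions): at each decoding step, the current decoder hidden state scores the encoded entries of the structured input, and the highest-scoring entry is selected as the content to verbalize next.

```python
# Minimal sketch of dynamic content planning (illustrative, not the NDP code):
# the decoder state at each step attends over encoded table entries and
# greedily picks the most relevant one as the next planned record.
import numpy as np

rng = np.random.default_rng(0)

def dynamic_content_plan(entries, decoder_states):
    """Select one table entry per decoding step via dot-product attention.

    entries        : (num_entries, d) encoded table records
    decoder_states : (num_steps, d) decoder hidden states seen so far
    returns        : list of selected entry indices, one per step
    """
    plan = []
    for h in decoder_states:
        scores = entries @ h                  # relevance of each record to this step
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()              # softmax over records
        plan.append(int(weights.argmax()))    # greedy selection of one record
    return plan

d = 8
entries = rng.normal(size=(5, d))   # five hypothetical table records
states = rng.normal(size=(3, d))    # three hypothetical decoding steps
print(dynamic_content_plan(entries, states))
```

Conditioning the selection on decoder states (rather than fixing the plan before generation, as in static content planning) is what lets the plan adapt to the text generated so far.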
Similar resources
Order-Planning Neural Text Generation From Structured Data
Generating texts from structured data (e.g., a table) is important for various natural language processing tasks such as question answering and dialog systems. In recent studies, researchers use neural language models and encoder-decoder frameworks for table-to-text generation. However, these neural network-based approaches do not model the order of contents during text generation. When a human...
An Object Oriented Approach to Content Planning for Text Generation
This paper describes GENIE, an object-oriented architecture that generates text with the intent of extending user expertise in interactive environments. Such environments present three interesting goals. First, to provide information within the task at hand. Second, to both respond to a user's task-related question and simultaneously extend their knowledge. Third, to do this in a manner that is...
Building RDF Content for Data-to-Text Generation
In Natural Language Generation (NLG), one important limitation is the lack of common benchmarks on which to train, evaluate and compare data-to-text generators. In this paper, we make one step in that direction and introduce a method for automatically creating an arbitrary large repertoire of data units that could serve as input for generation. Using both automated metrics and a human evaluatio...
Machine Comprehension by Text-to-Text Neural Question Generation
We propose a recurrent neural model that generates natural-language questions from documents, conditioned on answers. We show how to train the model using a combination of supervised and reinforcement learning. After teacher forcing for standard maximum likelihood training, we fine-tune the model using policy gradient techniques to maximize several rewards that measure question quality. Most no...
Neural Text Generation from Structured Data with Application to the Biography Domain
This paper introduces a neural model for concept-to-text generation that scales to large, rich domains. It generates biographical sentences from fact tables on a new dataset of biographies from Wikipedia. This set is an order of magnitude larger than existing resources with over 700k samples and a 400k vocabulary. Our model builds on conditional neural language models for text generation. To de...
Journal
Journal title: Knowledge Based Systems
Year: 2021
ISSN: 1872-7409, 0950-7051
DOI: https://doi.org/10.1016/j.knosys.2020.106610